From: route@monster.com
Sent: Tuesday, June 04, 2013 3:54 PM
To: hg@apeironinc.com
Subject: Please review this candidate for: Big Data
This resume has been forwarded to you at the request of Monster User xapeix01
Gunjan Mishra

An accomplished IT professional with 12+ years of experience in the analysis, design, development, implementation and support of complex software solutions in the health, finance, manufacturing and education domains.

Qualification Summary (with expertise in):

· Designing and developing solutions for Data Warehousing/Decision Support Systems, data integration, data migration, ETL processes and Business Intelligence using ETL tools (IBM DataStage, Informatica, Oracle Data Integrator, Oracle Warehouse Builder (OWB)), Oracle SQL, PL/SQL packages, constraints, functions, indexes, triggers, database objects, views, materialized views, Business Objects 5.1 and Cognos on relational database systems such as Oracle and MS SQL Server.
· Oracle internal architecture, data dictionaries and system packages; enterprise data warehouse archiving techniques; Oracle Streams, Oracle Advanced Queuing and Oracle Heterogeneous Services for clustered systems with load balancing.
· Oracle database support and troubleshooting, including installing Oracle software, patches and upgrades; managing and monitoring database performance; configuring and managing RMAN backup and recovery; and performance tuning using SQL Trace, EXPLAIN PLAN, hints, partitioning, TKPROF, profiling and Oracle code coverage utilities.
· Working with MS SQL Server 2008 R2, SSIS packages, SSRS reports and T-SQL, and supporting applications with a SQL Server backend.
· Data modeling experience creating conceptual, partially attributed conceptual, logical and physical data models using ERwin; defining and managing metadata for data marts and data warehouses.
· Designing report layouts and developing EIS, ad hoc and standard reports using business intelligence tools.
· Oracle SOA Suite and Oracle Fusion Middleware components.
· Customizing, developing and building extensions for Oracle Applications in client/server and n-tier architectures; deploying applications on web servers.
· Working with the Hadoop ecosystem: HDFS, MapReduce, Pig, Hive, HBase, Oozie, Sqoop, Flume and ZooKeeper.
· Excellent communication and interpersonal skills, the ability to work in a team or individually, and the ability to ramp up on new technologies quickly.

Can help clients with:
· Complete lifecycle implementation of data warehousing projects: design and automation of ETL processes on UNIX/Linux/Windows environments with error-handling design.
· Application support with backend integration services running on MS SQL Server.
· Big Data solutions: setup and administration of Hadoop clusters, upgrades/migration, performance tuning, troubleshooting and incident management.
· Oracle database PL/SQL development and Oracle Applications development/support.
· Data modeling (logical and physical), 3NF as well as dimensional (star schema); re-engineering data warehouses.
· Large-scale data conversion and migration projects (for example, SQL Server to Oracle).
· Moving data from a transactional to an analytical processing model.
· Establishing architecture and governance policies and procedures, production rollout and transition plans.
· Data warehouse platform selection (ETL, reporting, modeling and query tools).
· Facilitating JAD sessions and requirements-gathering workshops.
· Using UML use cases and SDLC processes, SOA (Service-Oriented Architecture), MDM (Master Data Management) and SOX.
· Database administration for single-server Oracle or Oracle Real Application Clusters environments, with Oracle GoldenGate for high availability and data replication and Data Guard for disaster recovery.
· Designing report layouts for EIS, ad hoc and standard reports, and web page layouts for EIS.
· Implementation of repository metadata for data warehouses.
· Business information modeling.

Technical Skills:
· Databases: Oracle 11g R2, MS SQL Server, DB2
· Database administration: Real Application Clusters, GoldenGate, Data Guard
· Data modeling tools: ERwin, Visio
· ETL/BI tools and IDEs: IBM WebSphere DataStage/QualityStage/Director, Cognos ReportNet, Informatica, Ab Initio, Oracle Warehouse Builder (OWB), Oracle Developer Suite 10g, Oracle JDeveloper ADF, Sybase V11
· Operating systems: Windows family, UNIX, Linux
· Web technologies: HTML, XML, XSLT
· Version control: VSS, PVCS
· Languages and other applications: Perl, Ruby on Rails, SQL, PL/SQL, C, C++, UNIX shell scripting, Toad, PL/SQL Developer, Office XP, MS Visio, MS Access
· High-level familiarity with Java/J2EE

Detailed Professional Experience
New York City Department of Education (NYCDOE), Brooklyn, NY, Jan '09 – Feb '13
Sr. Oracle ETL Developer/Data Architect

The New York City Department of Education is the largest public school system in the United States, serving about 1.1 million students in over 1,600 schools. It is creating an Operational Data Store (ODS) for the NYCDOE Special Education Student Information System (SESIS). The project vision for the ODS is to establish a data hub that brings together data from up to 30 source systems so that a broader, 360-degree view of special education students is available.

Responsibilities:
· Redefining an architectural blueprint for the next generation of NYCDOE enterprise systems.
· Creating an end-to-end data management component that defines the data life cycle across the enterprise, to leverage and improve existing data and to optimize NYCDOE's future business processes.
· Creating the Operational Data Store as a subcomponent of the larger architecture and data management system, while adhering to the architectural principles defined by the NYCDOE Enterprise Architecture group.
· Creating the initial solution blueprint by analyzing complex legacy systems, working with SMEs and reverse-engineering code from applications built on the legacy platforms.
· Integrating TIENET services (the system used by schools and users for the special education process) with legacy systems, SQL Server and SSIS packages.
· Designing and building fully integrated, real-time student services that process updates from legacy systems through web services, an ESB and PL/SQL packages.
· Developing the architected solutions with IBM DataStage for the initial and daily loads, using DataStage Designer to design and develop jobs for extracting, cleansing, transforming, integrating and loading data with stages such as Aggregator, Funnel, Change Capture, Change Apply, Sequential File, Hash File, Transformer, Merge, Join and Lookup.
· Working with project metadata, operational metadata and design metadata; using parallelism and pipelining to handle high volumes of work rapidly.
· Using DataStage Director and its run-time engine to schedule and run the parallel jobs, test and debug their components, and monitor the resulting executables on an ad hoc or scheduled basis.
· Designing job sequences to automate the process and documenting all job dependencies and predecessor jobs.
· Designing and developing a process for extracting monthly special education student reports.
· Setting up the initial Oracle test server boxes (Verizon provided hardware and network operations support).
· Oracle database maintenance and support.
· Developing, modifying and customizing SSIS packages and SSRS reports on MS SQL Server 2008 R2, used as the backend of the TIENET application; writing stored procedures and working with T-SQL queries.

Environment: IBM InfoSphere Information Server 8.1 (DataStage, QualityStage, Information Analyzer, Metadata Workbench, Business Glossary), Oracle 11g/10g, SQL, PL/SQL, SQL*Loader, XML, UNIX shell scripts, Windows XP, TOAD, Oracle SQL Developer, MS SQL Server 2008 R2, MS Access, Microsoft SharePoint, Windows FTP clients, PVCS (version control), Microsoft SSIS and SSRS.
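The job-sequence automation described in this engagement (run predecessor jobs before their dependents, log every step, stop the chain on the first failure) can be illustrated with a small shell driver. This is only a sketch: the step names and the `run_step` helper are hypothetical stand-ins, and in the real project the steps would invoke DataStage jobs rather than arbitrary commands.

```shell
#!/bin/sh
# run_sequence.sh -- minimal sketch of a job-sequence driver: run ETL steps
# in dependency order, log each one, and stop the chain on the first failure.

LOG=${LOG:-sequence.log}

run_step() {
    name=$1; shift
    echo "START $name" >> "$LOG"
    if "$@"; then
        echo "OK    $name" >> "$LOG"
    else
        echo "FAIL  $name" >> "$LOG"
        return 1
    fi
}

# Predecessor jobs must finish successfully before dependents start.
run_sequence() {
    run_step extract   "$@" && \
    run_step transform "$@" && \
    run_step load      "$@"
}
```

The `&&` chain encodes the dependency order; a failed predecessor leaves a FAIL line in the log and prevents every downstream step from running.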
Tracfone Wireless Inc., Miami, FL, Nov '08 – Jan '09
Oracle Data Warehousing Consultant

Tracfone Wireless is America's largest no-contract cellular service provider, with over 15 million U.S. subscribers. It is the fourth-largest cell phone company in the world and the largest in the Americas, with more than 200 million subscribers.

Responsibilities:
· Worked on requests from the sales division, which was changing its existing process to accommodate more users and faster decision-making; made the corresponding modifications to the existing packages, procedures and functions.
· Actively participated in discussions with other team members to enhance the overall architectural solution.
· Debugged existing PL/SQL procedures, functions and packages.
· Wrote shell scripts for reliably deploying new development to production.

Environment: Oracle 10g, SQL, PL/SQL, SQL*Loader, XML, UNIX shell scripts, Windows XP, TOAD, Oracle SQL Developer, MS SQL Server, MS Access, Microsoft SharePoint, Windows FTP clients, IBM LEI, PVCS (version control).
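A deployment wrapper of the kind mentioned above (apply a release's versioned files in order and abort on the first error, so production is never left half-deployed) might look like the following sketch. The `APPLY_CMD` hook is an assumption made so the script can be exercised without a database; in real use each file would be run through SQL*Plus.

```shell
#!/bin/sh
# deploy_release.sh -- sketch of a production deployment wrapper: apply the
# release's SQL files in lexical (versioned) order and stop at the first
# failure. APPLY_CMD is a hypothetical hook; in real use it would be
# something like: sqlplus -s "$CONNECT" @"$f"

APPLY_CMD=${APPLY_CMD:-"echo would-apply"}

deploy_release() {
    dir=$1
    for f in "$dir"/*.sql; do
        [ -e "$f" ] || { echo "nothing to deploy in $dir" >&2; return 0; }
        echo "applying $f"
        if ! $APPLY_CMD "$f"; then
            echo "deploy failed at $f -- stopping" >&2
            return 1
        fi
    done
    echo "release deployed"
}
```

Naming files with a numeric prefix (01_tables.sql, 02_procs.sql, ...) makes the shell glob's lexical order the deployment order.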
Malt-O-Meal, Minneapolis, MN, May '08 – Sep '08
Oracle Application/Data Warehouse Consultant

Malt-O-Meal is one of the top five cereal manufacturers in the USA, with manufacturing facilities and distribution centers across the country and headquarters in Minneapolis, MN.

Responsibilities:
· Provided backend support (writing and modifying packages, procedures, functions and triggers) for a customized Oracle application called COSMOS (customer order management system), working with Oracle Forms.
· Enhanced the overall Decision Support System: wrote PL/SQL functions, procedures, packages, triggers, database links, sequences, views and materialized views to make the best use of the Change Data Capture (CDC) functionality of Oracle 9i in batch processes, and successfully reduced the runtime of various CTAS processes leading up to the staging schema.
· Designed and developed a labor fact table and dimension tables in the data warehouse, reusing existing dimensions such as the date reference, for labor budget-vs.-actual reporting through Lawson (an ERP application used by HR).
· Modified the package that loaded LEI extracts: data was extracted with an IBM tool into a staging area and loaded from there into the data warehouse through a package, which I parameterized for the tables involved in the staging area.
· Wrote shell scripts for running and rerunning the ETL jobs; scheduled jobs ran through the Maestro scheduling tool.

Environment: Oracle 9i/10g, SQL, PL/SQL, SQL*Loader, XML, UNIX shell scripts, Windows XP, TOAD, PL/SQL Developer, Oracle Warehouse Builder 10g, MS SQL Server, MS Access, Cognos Metrics Studio, Windows FTP clients, IBM LEI, PVCS (version control).
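The run/rerun scripting mentioned above can be sketched as a retry wrapper: the scheduler (Maestro, in this engagement) launches the wrapper, which replays a transiently failing job a bounded number of times before giving up. The function below is an illustrative sketch, not the actual project script.

```shell
#!/bin/sh
# rerun_job.sh -- sketch of an ETL run/rerun wrapper: attempt a job up to N
# times so a transient failure is replayed without manual intervention.

rerun_job() {            # rerun_job <max_tries> <command...>
    tries=$1; shift
    n=1
    while [ "$n" -le "$tries" ]; do
        if "$@"; then
            echo "succeeded on attempt $n"
            return 0
        fi
        echo "attempt $n failed, retrying" >&2
        n=$((n + 1))
    done
    echo "giving up after $tries attempts" >&2
    return 1
}
```

Bounding the retries keeps a genuinely broken job from looping forever; the nonzero exit status lets the scheduler escalate to an operator.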
Humana Inc., Louisville, KY, Nov '05 – Apr '08
Oracle (PL/SQL) Developer/Analyst/Data Warehouse/Data Mart Consultant

Humana is one of the nation's largest health insurance providers and a Fortune 500 company. Humana provides a wide range of healthcare services and programs through traditional and internet-based plans to more than 7 million members, with an established reputation for quality and superior service.

Sales Data Mart Architectural Enhancement

Responsibilities:
· Developed shell scripts with embedded PL/SQL procedures to track and monitor data flow from the live enrollment system to the data mart (Oracle 10g running on AIX).
· Wrote PL/SQL functions, procedures and packages for data mart process enhancements; used existing reporting tables to create an automated process that raises alerts about any drop in data from one staging table to another while live data and platform loads make their way to the data mart.
· Developed an automated process for retriggering and reprocessing dropped data, and automated the daily platform load process, which had previously been kicked off manually every day.
· Created an automated process for GSU (Group Setup Utility) delete and refresh in conjunction with the front-end developers; used bulk collection, associative arrays and nested tables for bulk processing of data, which resulted in a performance improvement.
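The drop-detection idea above (raise an alert when fewer rows arrive at the next staging step than left the previous one) can be sketched in shell. Here `get_count` is a hypothetical stub that reads pre-staged count files so the sketch is self-contained; in the project it would run a COUNT(*) query against each staging table via SQL*Plus.

```shell
#!/bin/sh
# check_flow.sh -- sketch of staging-to-staging drop detection: compare row
# counts for a feed at two consecutive staging points and alert on any drop.
# get_count is a hypothetical stand-in for a per-table COUNT(*) query.

COUNT_DIR=${COUNT_DIR:-counts}

get_count() {                 # get_count <stage> <feed> -> prints a row count
    cat "$COUNT_DIR/$1.$2"
}

check_feed() {
    feed=$1
    before=$(get_count stage1 "$feed")
    after=$(get_count stage2 "$feed")
    if [ "$after" -lt "$before" ]; then
        echo "ALERT: feed $feed dropped from $before to $after rows"
        return 1
    fi
    echo "OK: feed $feed ($before -> $after rows)"
}
```

A nonzero return from `check_feed` is what lets the wrapper page an operator or kick off the retrigger/reprocess path described above.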
PDE Error Tracking

Responsibilities:
· Developing scripts for ad hoc requests related to PDE errors; the scripts are created in the ad hoc schema CI_MBRSHIP, and recurring requests are moved to production after data testing.
· Worked on the retro-member project: developed scripts that are run manually to load tables in the ad hoc schema; these tables are then imported into an MS Access database that queries the data and delivers it to users in an easy format. The database is created and sent out every week.
· Worked on the monthly finance table uploads, archiving the data for the past three months while loading the data for the current month.
· Worked extensively on the claims data for errors, LIS status and DIR rebates.
· Worked on data reconciliation between our system and Argus for pharmacy rebates.
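The monthly load-and-archive pattern above (load the current month, keep three months of history, prune anything older) can be sketched with files standing in for the monthly finance tables. The `finance_YYYYMM.dat` layout is hypothetical; the real process archived database tables, not files.

```shell
#!/bin/sh
# archive_month.sh -- sketch of the monthly upload pattern: load the current
# month, then keep only the current month plus three months of history,
# pruning older snapshots. File names are hypothetical stand-ins for tables.

archive_month() {            # archive_month <dir> <yyyymm>
    dir=$1; month=$2
    : > "$dir/finance_$month.dat"         # stand-in for the current-month load
    # newest 4 files = current month + 3 months of history; prune the rest
    for old in $(ls "$dir"/finance_*.dat | sort -r | sed -n '5,$p'); do
        rm -f "$old"
    done
}
```

Because YYYYMM sorts lexically in date order, `sort -r` puts the newest snapshots first and everything from line 5 onward is older than the retention window.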
Medicare Reporting

Responsibilities:
· Daily operational reporting of the Medicare Enrollment Summary for 2006 (CI system comparison to downstream processes). The Medicare enrollment data flows from the legacy systems to EDWPRO through daily mainframe jobs; once the data has been loaded, Medicare scripts that were written and moved to production populate the other tables required for daily operational reporting, which included the following.
· CI membership data based on SSN, card data based on SSN and UMID, and CI Argus data based on SSN; summary and detail of no-card, request-cancelled and in-progress card counts based on SSN and UMID.
· Change report summary and detail (CI System Change Report Summary) against the previous version: member data changes, card data changes and Argus data changes.
· Channel reporting (inventory through various channels), membership roll-forward summary and reason code summary.
· With OWB 10gR2, created source modules that import data from flat files and relational databases; these import routines consist of SQL*Loader scripts, external table definitions and database links to the Oracle and other databases accessed.
C3 Scorecard Ad Hoc Requests

Responsibilities:
· Created PL/SQL triggers, procedures and functions to process business logic and gather the data from the database used in the reports; reporting involved generating summaries on a monthly, quarterly, half-yearly and annual basis.
· Extensively involved in performance tuning of the queries, successfully reducing the run time of various scripts; the methods used were partitioning and indexing of the tables for better performance, faster retrieval of data and optimized query performance.

Environment: Oracle 9i/10g, SQL, PL/SQL, SQL*Loader, UNIX shell scripts, Windows XP, TOAD, PL/SQL Developer, Oracle Forms 9i, Reports 9i, Oracle Discoverer 10g, Oracle Warehouse Builder 10g, MS SQL Server, MS Access, Windows FTP clients.

CompuCredit, Atlanta, GA, May '04 – Sep '05
ETL Developer

CompuCredit is an information- and technology-driven provider and direct marketer of branded credit cards and related fee-based products and services offered to prospective customers on an unsecured basis.

Responsibilities:
· Performed data analysis based on the source systems and existing OLTP systems.
· Designed and built mappings and reusable objects using Informatica PowerCenter 6.2.
· Extensively used transformations such as Expression, Joiner (for heterogeneous sources), Lookup, Filter, Normalizer, Aggregator and Update Strategy to transform data into the interface tables and base tables.
· Created reusable transformations and mapplets to build error-free, reusable objects.
· Used Informatica functions such as LTRIM, RTRIM, ISNULL, DECODE, IIF and TO_DATE extensively in the transformations.
· Worked extensively in Informatica Repository Manager, Designer and Workflow Manager to create, schedule, monitor and validate data models defined in the ER diagram.
· Involved in extensive performance tuning by identifying bottlenecks in sources, targets, mappings and sessions.
· Performed unit testing to validate mappings, debugged mappings for failed sessions and populated the database; used existing mappings in debug mode extensively for error identification, creating breakpoints and watching the debug monitor.

Environment: Informatica PowerCenter 6.2 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Oracle 9i, Sun SPARC, Toad 7.6.

Other Clients

Rothschild Inc., NY, Nov '03 – Mar '04, Data Warehousing Consultant
Environment: Informatica PowerCenter 6.2 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Oracle 9i, Sun SPARC, Toad 7.6

Washington Mutual, IL, May '03 – Nov '03, ETL Developer
Environment: Oracle 8.x/9.x, SQL Server 2000, Informatica 5.x/6.x (PowerCenter/PowerMart), SQL*Loader, ERwin, shell scripts, Windows NT/2000

Target Corporation, MN, Feb '02 – Apr '03
Environment: Informatica PowerCenter 5.1/6.2, Business Objects 5.1, Oracle 9i, shell scripts, Sun Solaris 5.8, Windows NT, MS SQL Server 7.0, MS Access

Sprint, KS, Jan '01 – Dec '01, ETL Developer
Environment: Informatica PowerCenter 5.1, PL/SQL, MS SQL, Oracle 8i, Windows NT, UNIX Korn shell scripts

Daewoo Motors India Limited, India, Jul '98 – Nov '00, Oracle Application Developer
Environment: Windows NT client/server, HP-UX, Oracle 8, Developer 2000 (Rel. 2), Forms 4.5, Reports 2.5, SQL*Plus, SQL, PL/SQL

Education

Indian Institute of Technology, Roorkee, India, 1998